19 research outputs found

    2D Unsteady Routing and Flood Inundation Mapping for Lower Region of Brazos River Watershed

    Full text link
    The present study uses the two-dimensional flow routing capabilities of the Hydrologic Engineering Center's River Analysis System (HEC-RAS) for flood inundation mapping in the lower region of the Brazos River watershed, which is subject to frequent flooding. A river reach 20 km in length located at Richmond, Texas, was considered for the analysis. Detailed terrain information from a 1/9-arc-second-resolution digital elevation model was used to generate the two-dimensional (2D) flow area and flow geometry. Streamflow data from gauging station USGS08114000 were used for fully unsteady hydraulic modeling along the reach. The hydraulic model was then calibrated on Manning's roughness coefficient for the river reach by comparison with the downstream rating curve. The water surface elevations and velocity distributions obtained from the 2D hydraulic simulation were used to determine the extent of flooding, using the inundation mapping capabilities of RAS Mapper within HEC-RAS itself. Flooded areas were mapped in RAS Mapper from the inflow hydrograph at each time step, providing the spatial distribution of flow. The results of this study can be used for flood management as well as for land use and infrastructure development decisions.
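
    As a hedged illustration of the Manning's-roughness calibration step mentioned above, the Python sketch below compares Manning's-equation discharge for a few candidate roughness values against a single observed rating-curve point. The channel geometry, slope, and observed discharge are hypothetical placeholders rather than values from the study, and the actual calibration was carried out within HEC-RAS.

```python
# Illustrative sketch of Manning's-n calibration against a rating-curve point.
# All channel dimensions and the observed discharge are hypothetical examples,
# not values from the study.

def manning_discharge(n, area_m2, hyd_radius_m, slope):
    """Discharge (m^3/s) from Manning's equation: Q = (1/n) * A * R^(2/3) * S^(1/2)."""
    return (1.0 / n) * area_m2 * hyd_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Hypothetical wide rectangular cross-section at a given stage.
width_m, depth_m, bed_slope = 250.0, 6.0, 0.0002
area = width_m * depth_m
hyd_radius = area / (width_m + 2.0 * depth_m)

observed_q = 2400.0  # m^3/s, hypothetical rating-curve discharge at this stage

# Pick the roughness value whose computed discharge best matches the observation.
candidates = [0.025, 0.030, 0.035, 0.040, 0.045]
best_n = min(candidates,
             key=lambda n: abs(manning_discharge(n, area, hyd_radius, bed_slope) - observed_q))
print(f"best-fitting Manning's n = {best_n}")
```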

    EVALUATION OF STATE-OF-THE-ART PRECIPITATION ESTIMATES: AN APPROACH TO VALIDATE MULTI-SATELLITE PRECIPITATION ESTIMATES

    No full text
    The availability of precipitation data is important in nearly every aspect of hydrology. Readings from ground stations are reliable and are used in hydrological models to perform various analyses. However, the resulting predictions are always associated with uncertainties due to the limited number of ground stations, which requires interpolation of the data. Meanwhile, the groundbreaking approach of capturing precipitation events from the vantage point of satellites in space has made it possible not only to merge ground data with satellite estimates to produce more accurate results, but also to obtain data where ground stations are unavailable or sparse. Nevertheless, the data obtained through these satellite missions need to be verified with respect to their temporal and spatial resolution, as well as their associated uncertainties, before any decisions are made on their basis. This study focuses on obtaining and evaluating data from two multi-satellite precipitation measurement missions: i) the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) and ii) the Global Precipitation Measurement (GPM) mission. GPM is the latest mission, launched on February 28, 2014, after the successful completion of the TRMM mission, which collected valuable data for 17 years following its launch in November 1997. Both near-real-time and final-version precipitation products for TMPA and GPM are considered in this study. Two study areas representing the eastern and western parts of the United States of America (USA) are considered: i) Charlotte (CLT) in North Carolina and ii) San Francisco (SF) in California. The evaluation is carried out for daily accumulated rainfall estimates and for single rainfall events. Statistical analysis and error categorization of daily accumulated rainfall estimates were carried out in two parts: i) ten years of data available for TMPA products were considered for the historical analysis, and ii) TMPA and GPM data available for a ten-month common period were considered for the GPM-era analysis. To study how well the satellite estimates at their finest temporal and spatial resolution capture single rainfall events, and to explore their engineering application potential, an existing model of the SF watershed built in InfoWorks Integrated Catchment Model (ICM) was used for hydrological simulation. InfoWorks ICM is developed and maintained by Wallingford Software in the UK, and the SF watershed model is owned by San Francisco Public Works (SFPW). The historical analysis of TMPA products suggested overestimation of rainfall in the CLT region and underestimation in the SF region. The underestimation was largely associated with missed-rainfall events and negative hit events in SF. This inconsistency in estimation was evident in the GPM products as well. However, in the study of single rainfall events with higher rainfall depth in SF, the total rainfall volume and the runoff volume generated in the watershed were overestimated. Hence, satellite estimates in general tend to miss rainfall events of lower magnitude and overestimate events of higher magnitude. From the statistical analysis of GPM-era data, it was evident that GPM has been able to correct this inconsistency to some extent, reducing the overestimation in the CLT region and the negative error due to underestimation in SF. GPM products captured the outflow hydrograph shape in the SF watershed fairly well in comparison to TMPA. From this study, it can be concluded that even though GPM precipitation estimates cannot yet completely replace ground rain gauge measurements, with continued updating of the algorithms to correct their associated errors, they hold realistic engineering application potential in the near future.
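
    As a hedged illustration of the kind of error categorization described above (hits, missed-rainfall events, false alarms), the Python sketch below computes standard categorical verification scores for paired daily satellite and gauge rainfall series. The wet-day threshold and the sample arrays are hypothetical, not data from the study.

```python
# Illustrative categorical verification of satellite vs. gauge daily rainfall.
# The 1 mm/day wet-day threshold and the sample series are hypothetical.
import numpy as np

def categorical_scores(sat_mm, gauge_mm, wet_threshold=1.0):
    sat_wet = np.asarray(sat_mm) >= wet_threshold
    obs_wet = np.asarray(gauge_mm) >= wet_threshold
    hits = np.sum(sat_wet & obs_wet)
    misses = np.sum(~sat_wet & obs_wet)        # gauge saw rain, satellite did not
    false_alarms = np.sum(sat_wet & ~obs_wet)  # satellite saw rain, gauge did not
    pod = hits / (hits + misses) if hits + misses else np.nan                      # probability of detection
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan  # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses) if hits + misses else np.nan    # frequency bias
    return {"POD": pod, "FAR": far, "frequency_bias": bias}

# Hypothetical five-day example.
print(categorical_scores([0.0, 3.2, 0.4, 12.0, 0.0], [1.5, 2.8, 0.0, 15.0, 0.2]))
```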

    Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora

    No full text
    The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6% and 84.1 ± 0.6%, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.
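
    As a minimal sketch of how an identification efficiency and a binomial-style uncertainty such as the ±0.6% quoted above can be computed, the snippet below uses hypothetical counts of selected and total simulated test-beam particles; the numbers are placeholders, not values from the paper.

```python
# Efficiency and simple binomial uncertainty for reconstructed test-beam particles.
# The counts below are hypothetical placeholders, not values from the paper.
import math

def efficiency_with_error(n_selected, n_total):
    eff = n_selected / n_total
    err = math.sqrt(eff * (1.0 - eff) / n_total)  # simple binomial standard error
    return eff, err

eff, err = efficiency_with_error(n_selected=3014, n_total=3500)
print(f"efficiency = {100 * eff:.1f} +/- {100 * err:.1f} %")
```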

    Separation of track- and shower-like energy deposits in ProtoDUNE-SP using a convolutional neural network

    No full text
    Liquid argon time projection chamber detector technology provides high spatial and calorimetric resolution for charged particles traversing liquid argon. As a result, the technology has been used in a number of recent neutrino experiments, and is the technology of choice for the Deep Underground Neutrino Experiment (DUNE). In order to perform high-precision measurements of neutrinos in the detector, final state particles need to be effectively identified, and their energy accurately reconstructed. This article proposes an algorithm based on a convolutional neural network to perform the classification of energy deposits and reconstructed particles as track-like or arising from electromagnetic cascades. Results from testing the algorithm on experimental data from ProtoDUNE-SP, a prototype of the DUNE far detector, are presented. The network identifies track- and shower-like particles, as well as Michel electrons, with high efficiency. The performance of the algorithm is consistent between experimental data and simulation.
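
    As a hedged sketch of the kind of convolutional classifier the abstract describes (not the actual architecture from the paper), the PyTorch snippet below classifies small 2D charge-deposit patches into track-like, shower-like, and Michel-electron classes; the patch size, layer sizes, and input format are assumptions made for illustration.

```python
# Minimal sketch of a CNN that classifies 2D hit patches as track-like,
# shower-like, or Michel electron. Architecture and patch size are assumptions,
# not the network described in the paper.
import torch
import torch.nn as nn

class HitPatchClassifier(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 48x48 -> 24x24
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 12 * 12, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = HitPatchClassifier()
patch = torch.randn(8, 1, 48, 48)   # batch of 8 single-channel 48x48 patches
logits = model(patch)               # shape (8, 3): one score per class
print(logits.shape)
```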

    Scintillation light detection in the 6-m drift-length ProtoDUNE Dual Phase liquid argon TPC

    No full text
    DUNE is a dual-site experiment for long-baseline neutrino oscillation studies, neutrino astrophysics and nucleon decay searches. ProtoDUNE Dual Phase (DP) is a 6 × 6 × 6 m³ liquid argon time-projection-chamber (LArTPC) that recorded cosmic-muon data at the CERN Neutrino Platform in 2019–2020 as a prototype of the DUNE Far Detector. Charged particles propagating through the LArTPC produce ionization and scintillation light. The scintillation light signal in these detectors can provide the trigger for non-beam events. In addition, it adds precise timing capabilities and improves the calorimetry measurements. In ProtoDUNE-DP, scintillation and electroluminescence light produced by cosmic muons in the LArTPC is collected by photomultiplier tubes placed up to 7 m away from the ionizing track. In this paper, the ProtoDUNE-DP photon detection system performance is evaluated with a particular focus on the different wavelength shifters, such as PEN and TPB, and the use of Xe-doped LAr, considering its future use in giant LArTPCs. The scintillation light production and propagation processes are analyzed and a comparison of simulation to data is performed, improving understanding of the liquid argon properties.
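
    As a heavily simplified illustration of the light-propagation effects studied above, the sketch below estimates the relative fraction of emitted photons reaching a photodetector at increasing distances using only solid-angle falloff and exponential attenuation; the detector area and attenuation length are illustrative assumptions, not ProtoDUNE-DP measurements.

```python
# Toy estimate of relative scintillation-light yield vs. distance to a PMT,
# using only solid-angle falloff and exponential attenuation. All numbers are
# illustrative assumptions, not ProtoDUNE-DP measurements.
import math

def relative_detected_photons(distance_m, pmt_area_m2=0.05, attenuation_length_m=20.0):
    solid_angle_fraction = pmt_area_m2 / (4.0 * math.pi * distance_m ** 2)
    attenuation = math.exp(-distance_m / attenuation_length_m)
    return solid_angle_fraction * attenuation

for d in (1.0, 3.0, 5.0, 7.0):
    print(f"{d:.0f} m: {relative_detected_photons(d):.2e} of emitted photons")
```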

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    No full text
    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10³ pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
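
    As a hedged sketch of the pattern the abstract describes (Python translated into CUDA kernels with Numba), the snippet below accumulates ionization charge onto a pixel grid in parallel, one GPU thread per charge deposit. The grid geometry, pixel pitch, and input data are hypothetical, this is not the actual simulator code, and running it requires a CUDA-capable GPU.

```python
# Sketch of a Numba CUDA kernel that accumulates ionization charge onto a
# pixel grid, one GPU thread per deposit. Geometry, pitch, and inputs are
# hypothetical; this is not the DUNE Near Detector simulator code.
import numpy as np
from numba import cuda

PIXEL_PITCH = 0.4  # cm, assumed pixel pitch

@cuda.jit
def deposit_charge(x, y, q, pixel_grid):
    i = cuda.grid(1)
    if i < x.size:
        col = int(x[i] / PIXEL_PITCH)
        row = int(y[i] / PIXEL_PITCH)
        if 0 <= row < pixel_grid.shape[0] and 0 <= col < pixel_grid.shape[1]:
            # Atomic add avoids races when several deposits hit the same pixel.
            cuda.atomic.add(pixel_grid, (row, col), q[i])

n = 100_000
x = np.random.uniform(0.0, 12.8, n).astype(np.float32)  # cm
y = np.random.uniform(0.0, 12.8, n).astype(np.float32)  # cm
q = np.random.exponential(1.0, n).astype(np.float32)    # arbitrary charge units
grid = np.zeros((32, 32), dtype=np.float32)

threads = 256
blocks = (n + threads - 1) // threads
deposit_charge[blocks, threads](x, y, q, grid)  # Numba handles host<->device copies
print(grid.sum())
```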

    DUNE Offline Computing Conceptual Design Report

    No full text
    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves and to provide computing that achieves the physics goals of the DUNE experiment.

    DUNE Offline Computing Conceptual Design Report

    No full text
    This document describes the conceptual design for the Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE). The goals of the experiment include 1) studying neutrino oscillations using a beam of neutrinos sent from Fermilab in Illinois to the Sanford Underground Research Facility (SURF) in Lead, South Dakota, 2) studying astrophysical neutrino sources and rare processes and 3) understanding the physics of neutrino interactions in matter. We describe the development of the computing infrastructure needed to achieve the physics goals of the experiment by storing, cataloging, reconstructing, simulating, and analyzing ~30 PB of data per year from DUNE and its prototypes. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions and advanced algorithms as HEP computing evolves. We describe the physics objectives, organization, use cases, and proposed technical solutions.